VC Dimension Bounds for Higher-Order Neurons
Abstract
We investigate the sample complexity of learning with higher-order neurons. We calculate upper and lower bounds on the Vapnik-Chervonenkis (VC) dimension and the pseudo dimension for higher-order neurons that allow unrestricted interactions among the input variables. In particular, we show that the degree of interaction is irrelevant for the VC dimension and that the individual degree of the variables plays only a minor role. Further, our results reveal that the crucial parameters that affect the VC dimension of higher-order neurons are the input dimension and the maximum number of occurrences of each variable. The lower bounds that we establish are asymptotically almost tight. In particular, they show that the VC dimension is superlinear in the input dimension. Bounds for higher-order neurons with sigmoidal activation function are also derived.
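For concreteness, a minimal sketch of the model under study, in our own notation (the abstract itself fixes no formula): a higher-order neuron over inputs x_1, ..., x_n thresholds a weighted sum of monomials in the inputs,

    y = \operatorname{sign}\Bigl( w_0 + \sum_{M} w_M \prod_{i \in M} x_i \Bigr),

where each index multiset M selects one product term. "Unrestricted interactions" means the monomials may combine arbitrary subsets of the variables; the results above say the bounds are driven by the input dimension n and by how often each variable may occur across the monomials, not by the degree of the terms.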
Similar references
VC dimension bounds for networks of spiking neurons
We calculate bounds on the VC dimension and pseudo dimension for networks of spiking neurons. The connections between network nodes are parameterized by transmission delays and synaptic weights. We provide bounds in terms of network depth and number of connections that are almost linear. For networks with few layers this yields better bounds than previously established results for networks of ...
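A hedged sketch of the kind of spiking-neuron model such bounds typically refer to, assuming the standard spike-response form (the abstract names only delays and weights as the parameters): the potential of a neuron at time t is

    P(t) = \sum_i w_i \, \varepsilon\bigl( t - s_i - d_i \bigr),

where s_i are the input spike times, d_i the transmission delays, w_i the synaptic weights, and \varepsilon a fixed response function; the neuron emits a spike when P(t) crosses its threshold.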
VC Dimension Bounds for Product Unit Networks
A product unit is a formal neuron that multiplies its input values instead of summing them. Furthermore, it has weights acting as exponents instead of being factors. We investigate the complexity of learning for networks containing product units. We establish bounds on the Vapnik-Chervonenkis (VC) dimension that can be used to assess the generalization capabilities of these networks. In particu...
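Written out, the distinction in this abstract is the following, in our own notation: a product unit computes

    y = \prod_{i=1}^{n} x_i^{w_i},

with the weights w_i acting as exponents, whereas a conventional summing unit computes \sigma\bigl( \sum_{i=1}^{n} w_i x_i + w_0 \bigr), with the weights acting as factors.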
On the Complexity of Computing and Learning with Multiplicative Neural Networks
In a great variety of neuron models, neural inputs are combined using the summing operation. We introduce the concept of multiplicative neural networks that contain units that multiply their inputs instead of summing them and thus allow inputs to interact nonlinearly. The class of multiplicative neural networks comprises such widely known and well-studied network types as higher-order networks ...
Radial Basis Function Neural Networks Have Superlinear VC Dimension
We establish superlinear lower bounds on the Vapnik-Chervonenkis (VC) dimension of neural networks with one hidden layer and local receptive field neurons. As the main result we show that every reasonably sized standard network of radial basis function (RBF) neurons has VC dimension Ω(W log k), where W is the number of parameters and k the number of nodes. This significantly improves the previousl...
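As a hedged illustration, assuming the common Gaussian form of an RBF neuron (the abstract does not fix the kernel): a unit with center c and width \sigma computes

    g(x) = \exp\bigl( -\|x - c\|^2 / \sigma^2 \bigr),

so that in the Ω(W log k) bound, W counts the adjustable parameters (centers, widths, and output weights) and k the number of such hidden units.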
Neural Networks with Local Receptive Fields and Superlinear VC Dimension
Local receptive field neurons comprise such well-known and widely used unit types as radial basis function (RBF) neurons and neurons with center-surround receptive field. We study the Vapnik-Chervonenkis (VC) dimension of feedforward neural networks with one hidden layer of these units. For several variants of local receptive field neurons, we show that the VC dimension of these networks is sup...
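For the center-surround variant mentioned here, a hedged sketch, assuming the common difference-of-Gaussians form (not specified in the abstract):

    g(x) = \exp\bigl( -\|x - c\|^2 / \sigma_1^2 \bigr) - \kappa \, \exp\bigl( -\|x - c\|^2 / \sigma_2^2 \bigr), \quad \sigma_2 > \sigma_1,

an excitatory center Gaussian minus a broader inhibitory surround.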
Publication year: 1999